# Multi-language retention ability

TinyLlama 1.1B Python V0.1
Apache-2.0
TinyLlama is a lightweight Llama model with 1.1 billion parameters, pre-trained on 3 trillion tokens, making it suitable for applications with limited computing resources; a minimal loading sketch follows the listing below.
Large Language Model · Transformers · English
TinyLlama
1,274
12
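
For readers who want to try the model on modest hardware, here is a minimal sketch of loading it with the Hugging Face Transformers library mentioned in the tags. The repository ID `TinyLlama/TinyLlama-1.1B-python-v0.1` is an assumption inferred from the model name on this page and should be verified on the model hub before use.

```python
# Minimal sketch: load the model with Hugging Face Transformers and
# generate a short Python completion. The repo ID below is assumed from
# the model name shown on this page; verify it before running.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "TinyLlama/TinyLlama-1.1B-python-v0.1"  # assumed repository ID

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# The 1.1B parameter count keeps memory needs low enough for CPU or a
# single consumer GPU, which is the use case the description highlights.
prompt = "def fibonacci(n):"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Because the model is small, no quantization or device mapping is strictly required here, though either could further reduce the memory footprint.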